(Music)

TERRY FONG: The Intelligent Robotics Group is developing new robotics technology to improve the way that humans can explore the solar system. A key part of that is looking at how humans and robots can work together as teams, so that humans can support robots and robots can support humans. We want to interact with autonomous systems. We want to be able to create systems that we can trust in all kinds of circumstances.

Several years ago, when we started working with these remotely operated robots, we needed a piece of software that would allow us to look at the terrain, to look at the sensor data coming from the robots, and understand the robot's situation. And that led to the creation of VERVE. VERVE is a 3-D robot user interface. It allows us to see the 3-D world that the robots are operating in. It's been used with our K10 planetary rovers, our K-REX planetary rover, with SPHERES on the Space Station, and with our new robot, the Astrobee, which will be on the Space Station in 2017.

In 2013 we carried out a series of tests with astronauts on the International Space Station, and in those tests we had astronauts who were flying 200 miles above the Earth remotely operate a robot, the K10 planetary rover, here in California.

MARIA BUALAT: You cannot joystick a robot at the distance we're dealing with because of time delay. You need a robot that's very safe, that can operate on its own, can complete tasks on its own. On the other hand, you still want the human in the loop, because the human brings a lot of experience and very powerful cognitive ability that can deal with issues that the autonomy's not quite ready to handle. That's why NASA feels it's a very potent combination to use both the human capability and the robotic capability together.
TERRY FONG: After our 2013 tests involving an astronaut on the Space Station remotely operating a robot here on Earth, we realized the software could be applied to lots of different uses. One of those uses happened to be supporting the operation of an autonomous vehicle. We were very excited when Nissan was interested in applying this to self-driving cars.

EUGENE TU: One of the key goals of NASA is to transfer technology out to the commercial sector for broader use. When we engage in these types of partnerships, we have a real opportunity to gain knowledge from them as well. Our collaboration with Nissan North America and with other self-driving car companies is an example of that. One of the things we gain is learning how our autonomy is used and how humans interact with it. The need for autonomy, and greater autonomy, is always going to be there for NASA missions in the future. But by seeing it applied in a real-world setting like self-driving cars, we will get that knowledge and benefit as well.

TERRY FONG: I love robots. I help build and test them. In the future, I see robots everywhere. We have the potential of this technology reaching thousands or millions of people,